A general methodology for designing globally convergent optimization neural networks
Authors
Abstract
In this paper, we present a general methodology for designing optimization neural networks. We prove that the neural networks constructed by the proposed method are guaranteed to be globally convergent to solutions of problems with bounded or unbounded solution sets, in contrast with gradient methods, whose convergence is not guaranteed. We show that the proposed method contains both the gradient methods and the nongradient methods employed in existing optimization neural networks as special cases. Based on the theoretical results of the proposed method, we study the convergence and stability of general gradient models in the case of non-isolated solutions. Using the proposed method, we derive new neural network models for a very large class of optimization problems, in which the equilibrium points correspond to exact solutions and there are no adjustable parameters. Finally, some numerical examples show the effectiveness of the method.
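The core idea the abstract describes, modeling an optimization "neural network" as a continuous-time dynamical system whose trajectories settle at a solution, can be illustrated with a minimal sketch. The quadratic problem, the gradient-flow dynamics, and all variable names below are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): an optimization
# neural network realized as the gradient flow dx/dt = -grad f(x)
# for the convex quadratic f(x) = 0.5 x^T Q x - b^T x, simulated
# with forward Euler. The equilibrium of the flow is the minimizer.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

x = np.zeros(2)          # initial state of the network
dt = 0.01                # Euler step (assumed small enough for stability)
for _ in range(5000):
    x = x - dt * (Q @ x - b)             # follow -grad f(x) = -(Qx - b)

x_star = np.linalg.solve(Q, b)           # exact minimizer, for comparison
print(x, x_star)                         # trajectory settles at the solution
```

For strictly convex problems such as this one the flow converges to the unique minimizer; the paper's contribution concerns the harder cases (unbounded or non-isolated solution sets) where plain gradient dynamics can fail.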
Related articles
PROJECTED DYNAMICAL SYSTEMS AND OPTIMIZATION PROBLEMS
We establish a relationship between general constrained pseudoconvex optimization problems and globally projected dynamical systems. A corresponding novel neural network model, which is globally convergent and stable in the sense of Lyapunov, is proposed. Both theoretical and numerical approaches are considered. Numerical simulations for three constrained nonlinear optimization problems a...
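A globally projected dynamical system of the kind this related article studies can be sketched as follows. The box-constrained least-squares problem, the projection operator, and the step size are assumptions chosen for illustration:

```python
import numpy as np

# Hedged sketch of a projected dynamical system for constrained
# optimization: minimize f(x) = 0.5*||x - c||^2 subject to x in [0,1]^2,
# using the dynamics dx/dt = P_K(x - grad f(x)) - x, discretized by Euler.
def project_box(x, lo=0.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

c = np.array([1.5, -0.3])    # unconstrained minimizer lies outside the box
x = np.array([0.5, 0.5])     # initial state
dt = 0.1
for _ in range(500):
    grad = x - c                          # gradient of 0.5*||x - c||^2
    x = x + dt * (project_box(x - grad) - x)

print(x)   # settles at the projection of c onto the box
```

At equilibrium x = P_K(x - grad f(x)), which is exactly the variational-inequality characterization of a constrained minimizer, so the rest point of the flow is the solution of the constrained problem.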
A Neural Network for Nonlinear Optimization with General Linear Constraints
In this study, we investigate a novel neural network for solving nonlinear convex programming problems with general linear constraints. Furthermore, we extend this neural network to solve a class of variational inequality problems. These neural networks are stable in the sense of Lyapunov and globally convergent to a unique optimal solution. The present convergence results do not require Lip...
OPTIMUM SHAPE DESIGN OF DOUBLE-LAYER GRIDS BY QUANTUM BEHAVED PARTICLE SWARM OPTIMIZATION AND NEURAL NETWORKS
In this paper, a methodology is presented for optimum shape design of double-layer grids subject to gravity and earthquake loadings. The design variables are the number of divisions in two directions, the height between two layers and the cross-sectional areas of the structural elements. The objective function is the weight of the structure and the design constraints are some limitations on str...
An efficient modified neural network for solving nonlinear programming problems with hybrid constraints
This paper presents the optimization techniques for solving convex programming problems with hybrid constraints. According to the saddle point theorem, optimization theory, convex analysis theory, Lyapunov stability theory and LaSalleinvariance principle, a neural network model is constructed. The equilibrium point of the proposed model is proved to be equivalent to the optima...
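The saddle-point construction this related article relies on can be sketched with primal-dual (Arrow–Hurwicz-style) dynamics. The equality-constrained problem and all names below are illustrative assumptions, not the article's model:

```python
import numpy as np

# Hedged sketch of a saddle-point neural network: minimize 0.5*||x||^2
# subject to a^T x = 1, with Lagrangian L(x, lam) = 0.5*||x||^2
# + lam*(a^T x - 1). The state descends in x and ascends in lam,
# so its equilibrium is a saddle point of L, i.e. a KKT point.
a = np.array([1.0, 1.0])
x = np.zeros(2)
lam = 0.0
dt = 0.01
for _ in range(5000):
    dx = -(x + lam * a)        # -grad_x L
    dlam = a @ x - 1.0         #  grad_lam L
    x, lam = x + dt * dx, lam + dt * dlam

print(x, lam)   # analytic solution: x = a/||a||^2, lam = -1/||a||^2
```

Because the objective is strictly convex, the saddle point is unique and the flow spirals into it; this is the sense in which such a model's equilibrium "is equivalent to the optimum."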
Journal: IEEE Transactions on Neural Networks
Volume: 9, Issue: 6
Pages: -
Publication date: 1998